Deepfakes: Artificial intelligence (AI) and deepfakes have created serious challenges in the digital world. Victims range from legends like Sachin Tendulkar and Ratan Tata to many Bollywood and business personalities. There is now concern that the upcoming general elections in India could be influenced through AI and deepfakes. Against this backdrop, big tech companies have geared up to prevent misuse of the technology in elections and have agreed to work together to keep the democratic process free from its influence.
Elections are going to be held in about 50 countries
Elections are scheduled in about 50 countries in 2024, and a large share of the world's population will elect their governments this year. With this in mind, the tech companies have decided to act against the misuse of AI. The agreement was announced at the Munich Security Conference: Adobe, Amazon, Google, IBM, Meta, Microsoft, OpenAI, TikTok and X will work together to curb such content so that AI cannot be used to influence elections.
Deepfake videos, audio and photos will be stopped
At the conference, concern was expressed about deepfake videos, audio and photos created with AI, which can be used to sway voters who may not be able to tell whether such content is real or fake. The agreement is being seen as a major step against the misuse of AI in politics. Chatbot developers Anthropic and Inflection AI, voice-cloning startup Eleven Labs, chip designer Arm Holdings, and security companies McAfee and Trend Micro have also joined the agreement.
Tech companies unite for free and fair elections
Meta's President of Global Affairs, Nick Clegg, said it is very important to pay attention to the challenges arising from AI. No single tech company, government or civil society organization can fight this battle alone, he said, so it is essential that everyone works together. Major companies joining forces will help tackle the misuse of AI and deepfakes; everyone, he added, wants free and fair elections.